convolution code - translation to

Type of error-correcting code using convolution.
Related terms: Convolution code; Constraint length; Recursive Systematic Convolutional code; Convolutional coding; Convolution encoding; Convolution coding; Trellis diagram; Convolutional codes; Convolutional coders; Feed forward Convolutional code; Recursive Convolutional code
  • Shift register for the (7, [171, 133]) convolutional code. Branch polynomials: <math>h^1 = 171_o = [1111001]_b</math>, <math>h^2 = 133_o = [1011011]_b</math>. All arithmetic is performed modulo 2. (A minimal Python sketch of this encoder appears after this list.)
  • LLR algorithms.<ref>[https://www.mathworks.com/help/comm/examples/llr-vs-hard-decision-demodulation.html LLR vs. Hard Decision Demodulation (MathWorks)]</ref><ref>[https://www.mathworks.com/help/comm/ug/estimate-ber-for-hard-and-soft-decision-viterbi-decoding.html Estimate BER for Hard and Soft Decision Viterbi Decoding (MathWorks)]</ref> (Exact<ref>[https://www.mathworks.com/help/comm/ug/digital-modulation.html#brc6yjx Digital modulation: Exact LLR Algorithm (MathWorks)]</ref> and Approximate<ref>[https://www.mathworks.com/help/comm/ug/digital-modulation.html#brc6ymu Digital modulation: Approximate LLR Algorithm (MathWorks)]</ref>) over an additive white Gaussian noise channel.
  • Img. 1. Rate 1/3 non-recursive, non-systematic convolutional encoder with constraint length 3.
  • Interleaving and deinterleaving: increasing the separation of code words in the time domain to avoid bursty distortions.
  • Decoding complexity increases exponentially with constraint length, limiting these more powerful codes to deep-space missions where the extra performance is easily worth the increased decoder complexity.
  • Theoretical bit-error-rate curves for encoded QPSK (recursive and non-recursive, soft decision) over an additive white Gaussian noise channel. The curves are barely distinguishable because the codes have approximately the same free distances and weights.
  • Convolutional codes with 1/2 and 3/4 code rates (constraint length 7, soft decision, 4-QAM / QPSK / OQPSK).<ref>[https://ch.mathworks.com/help/comm/ug/punctured-convolutional-coding-1.html Punctured Convolutional Coding (MathWorks)]</ref>
  • A turbo code with component codes 13, 15.<ref>[http://www.scholarpedia.org/article/Turbo_codes Turbo code]</ref> Turbo codes get their name because the decoder uses feedback, like a turbo engine. Permutation here means the same as interleaving. C1 and C2 are recursive convolutional codes. Recursive and non-recursive convolutional codes differ little in BER performance; however, the recursive type is used in turbo codes because of its better interleaving properties.<ref>Benedetto, Sergio, and Guido Montorsi. "[https://ieeexplore.ieee.org/abstract/document/390945/ Role of recursive convolutional codes in turbo codes]." Electronics Letters 31.11 (1995): 858-859.</ref>
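The shift-register encoder from the first caption can be written out in a few lines. The following is a minimal sketch (not taken from any cited source), assuming the common convention that the most significant bit of each octal branch polynomial taps the current input bit; the function and variable names are illustrative.

<syntaxhighlight lang="python">
# Minimal sketch of a rate-1/2, constraint-length-7 feedforward convolutional
# encoder with branch polynomials 171 and 133 (octal), as in the caption above.
# All arithmetic is modulo 2 (XOR), implemented here as bit parity.

G1 = 0o171   # h^1 = 1111001 in binary
G2 = 0o133   # h^2 = 1011011 in binary
K = 7        # constraint length

def encode(bits):
    """Encode a sequence of 0/1 bits; returns the interleaved coded stream."""
    state = 0                                        # K-bit shift register
    out = []
    for b in bits:
        state = (state >> 1) | (b << (K - 1))        # new bit enters at the MSB
        out.append(bin(state & G1).count("1") % 2)   # parity of the h^1 taps
        out.append(bin(state & G2).count("1") % 2)   # parity of the h^2 taps
    return out

print(encode([1, 0, 1, 1, 0, 0, 0]))
</syntaxhighlight>

Each input bit yields two output bits, so the code rate is 1/2; feeding the register from the opposite end (or, equivalently, bit-reversing the polynomials) produces an equivalent code.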

convolutional coding
mathematics: свёрточное кодирование (convolutional coding)

convolutional code
computer engineering: свёрточный код (convolutional code)

convolution code
сверточный код (convolutional code)


Wikipedia

Convolutional code

In telecommunication, a convolutional code is a type of error-correcting code that generates parity symbols via the sliding application of a Boolean polynomial function to a data stream. This sliding application represents the 'convolution' of the encoder over the data, which gives rise to the term 'convolutional coding'. The sliding nature of convolutional codes facilitates trellis decoding using a time-invariant trellis, which in turn allows convolutional codes to be maximum-likelihood soft-decision decoded with reasonable complexity.
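To make the trellis-search idea concrete, here is a toy sketch of Viterbi decoding for an assumed rate-1/2, constraint-length-3 code with example generators 7 and 5 (octal); it is an illustration, not the decoder or code of any system mentioned in this article. Hard-decision decoding with a Hamming branch metric is shown for brevity; a soft-decision decoder changes only the branch metric (for example to a squared-Euclidean or LLR-based distance) while the trellis search stays the same.

<syntaxhighlight lang="python">
# Toy Viterbi (trellis) decoder for an assumed rate-1/2, constraint-length-3
# convolutional code with example generators 7 and 5 (octal).
K = 3
GENS = (0o7, 0o5)            # taps; the MSB corresponds to the current input bit
N_STATES = 1 << (K - 1)      # 2^(K-1) = 4 trellis states

def branch(state, bit):
    """Next state and output symbols when `bit` is fed from trellis `state`."""
    reg = (bit << (K - 1)) | state                      # current input + memory
    out = [bin(reg & g).count("1") % 2 for g in GENS]   # modulo-2 tap sums
    return reg >> 1, out                                # next memory, outputs

def encode(bits):
    state, out = 0, []
    for b in bits:
        state, sym = branch(state, b)
        out.extend(sym)
    return out

def viterbi(received):
    """Hard-decision Viterbi decoding of a rate-1/2 symbol stream."""
    INF = float("inf")
    metric = [0.0] + [INF] * (N_STATES - 1)             # start in the zero state
    paths = [[] for _ in range(N_STATES)]
    for t in range(0, len(received), 2):
        r = received[t:t + 2]
        new_metric = [INF] * N_STATES
        new_paths = [None] * N_STATES
        for s in range(N_STATES):
            if metric[s] == INF:
                continue
            for b in (0, 1):
                ns, sym = branch(s, b)
                m = metric[s] + sum(x != y for x, y in zip(sym, r))  # Hamming metric
                if m < new_metric[ns]:                   # keep the better survivor
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    return paths[metric.index(min(metric))]              # best surviving path

data = [1, 0, 1, 1, 0]
noisy = encode(data)
noisy[3] ^= 1                  # flip one coded bit to simulate a channel error
print(viterbi(noisy) == data)  # True: the single error is corrected
</syntaxhighlight>

The per-step work is proportional to the number of states times the number of branches per state, which is why complexity grows exponentially with the constraint length but stays fixed as the message gets longer.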

The ability to perform economical maximum-likelihood soft-decision decoding is one of the major benefits of convolutional codes. This is in contrast to classic block codes, which are generally represented by a time-variant trellis and therefore are typically hard-decision decoded. Convolutional codes are often characterized by the base code rate and the depth (or memory) of the encoder <math>[n, k, K]</math>. The base code rate is typically given as <math>n/k</math>, where n is the raw input data rate and k is the data rate of the output channel-encoded stream. n is less than k because channel coding inserts redundancy into the input bits. The memory is often called the "constraint length" K, where the output is a function of the current input as well as the previous <math>K-1</math> inputs. The depth may also be given as the number of memory elements v in the polynomial or the maximum possible number of states of the encoder (typically <math>2^v</math>).
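As a concrete check of these definitions, take the rate-1/2, constraint-length-7 code from the captions above: <math>n/k = 1/2</math> and <math>K = 7</math>, so the memory is <math>v = K - 1 = 6</math> and the encoder has <math>2^v = 2^6 = 64</math> possible states, which is the number of trellis states a Viterbi decoder for that code must track.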

Convolutional codes are often described as continuous. However, it may also be said that convolutional codes have arbitrary block length, rather than being continuous, since most real-world convolutional encoding is performed on blocks of data. Convolutionally encoded block codes typically employ termination. The arbitrary block length of convolutional codes can also be contrasted to classic block codes, which generally have fixed block lengths that are determined by algebraic properties.
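One common termination method is zero-tail termination: appending <math>K-1</math> zero bits after the data so the encoder returns to the all-zero state, giving the decoder a known end state. Below is a minimal sketch, using assumed example generators 7 and 5 (octal) rather than any code from this article.

<syntaxhighlight lang="python">
# Zero-tail termination sketch: K-1 zero "tail" bits drive the encoder back to
# the all-zero state at the end of each block (assumed example generators).
K = 3
GENS = (0o7, 0o5)

def encode_terminated(bits):
    state, out = 0, []
    for b in list(bits) + [0] * (K - 1):              # data bits + tail bits
        state = (state >> 1) | (b << (K - 1))         # new bit enters at the MSB
        out.extend(bin(state & g).count("1") % 2 for g in GENS)
    assert state >> 1 == 0                            # memory is back to zero
    return out

print(encode_terminated([1, 1, 0, 1]))
</syntaxhighlight>

Tail-biting is a common alternative that avoids the small rate loss of the tail bits by initializing the encoder memory with the last <math>K-1</math> data bits, so the start and end states coincide.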

The code rate of a convolutional code is commonly modified via symbol puncturing. For example, a convolutional code with a 'mother' code rate <math>n/k = 1/2</math> may be punctured to a higher rate of, for example, <math>7/8</math> simply by not transmitting a portion of code symbols. The performance of a punctured convolutional code generally scales well with the amount of parity transmitted. The ability to perform economical soft decision decoding on convolutional codes, as well as the block length and code rate flexibility of convolutional codes, makes them very popular for digital communications.
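A puncturing sketch follows, applying a repeating keep/discard pattern to the interleaved symbols of a rate-1/2 mother code. The pattern below is illustrative only, not a standardized one; the 7/8 rate mentioned above corresponds to keeping 8 out of every 14 mother-code symbols.

<syntaxhighlight lang="python">
# Puncturing sketch: drop symbols of a rate-1/2 mother code according to a
# repeating keep/discard pattern (illustrative pattern, not a standardized one).
# Keeping 4 of every 6 symbols means 3 data bits -> 4 transmitted symbols,
# i.e. the rate rises from 1/2 to 3/4.
PATTERN = [1, 1, 0, 1, 1, 0]          # 1 = transmit, 0 = puncture

def puncture(symbols, pattern=PATTERN):
    return [s for i, s in enumerate(symbols) if pattern[i % len(pattern)]]

coded = [1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1]   # 12 mother-code symbols
print(puncture(coded))                          # 8 symbols survive
</syntaxhighlight>

At the receiver, erasures (neutral branch metrics) are inserted at the punctured positions so that the mother-code Viterbi decoder can be reused unchanged.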